92 research outputs found

    General distributions in process algebra

    Boosting Fault Tree Analysis by Formal Methods

    Code Generation = A* + BURS

    BURS, a system based on term rewrite systems, and the search algorithm A* are combined to produce a code generator that generates optimal code. The theory underlying BURS is re-developed, formalised and explained in this work. The search algorithm directs the search using a cost heuristic derived from the term rewrite system. The advantage of using a search algorithm is that only those costs that may be part of an optimal rewrite sequence need to be computed.
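
    As a concrete illustration of the search component, here is a minimal Python sketch of A* over rewrite sequences. It is not the authors' code generator: `successors` and `heuristic` are hypothetical callbacks standing in for the applicable rewrite rules and the cost estimate derived from the term rewrite system. Provided the heuristic never overestimates the remaining cost, the first goal term taken from the queue closes a cheapest rewrite sequence, and costs are only computed along paths that could still be optimal.

```python
import heapq
from itertools import count

def a_star_rewrite(start, is_goal, successors, heuristic):
    """Cheapest rewrite sequence from `start` to a goal term via A*.

    successors(term) yields (rule, next_term, cost) triples and
    heuristic(term) must never overestimate the remaining cost, so the
    first goal term popped from the queue ends an optimal sequence.
    """
    tie = count()                         # tie-breaker so terms are never compared
    frontier = [(heuristic(start), 0.0, next(tie), start, [])]
    best = {start: 0.0}                   # cheapest known cost per term
    while frontier:
        _, g, _, term, rules = heapq.heappop(frontier)
        if is_goal(term):
            return rules, g               # optimal rule sequence and its cost
        if g > best.get(term, float("inf")):
            continue                      # stale queue entry
        for rule, nxt, cost in successors(term):
            g2 = g + cost
            if g2 < best.get(nxt, float("inf")):
                best[nxt] = g2            # only promising costs are ever computed
                heapq.heappush(frontier,
                               (g2 + heuristic(nxt), g2, next(tie), nxt, rules + [rule]))
    return None, float("inf")             # no rewrite sequence reaches a goal
```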

    Analysis of Internal Boundaries and Transition Regions in Geophysical Systems with Advanced Processing Techniques

    This thesis examines the utility of the Rényi entropy (RE), a measure of the complexity of probability density functions, as a tool for finding physically meaningful patterns in geophysical data. Initially, the RE is applied to observational data of long-lived atmospheric tracers in order to analyse the dynamics of stratospheric transition regions associated with barriers to horizontal mixing. Its wider applicability is investigated by testing the RE as a method for highlighting internal boundaries in snow and ice in ground penetrating radar (GPR) recordings. High-resolution 500 MHz GPR soundings of dry snow were acquired at several sites near Scott Base, Antarctica, in 2008 and 2009, with the aim of using the RE to facilitate the identification and tracking of subsurface layers and so extrapolate point measurements of accumulation from snow pits and firn cores to larger areas.

    The atmospheric analysis focuses on applying the RE to observational tracer data from the EOS-MLS satellite instrument. Nitrous oxide (N2O) is shown to exhibit subtropical RE maxima in both hemispheres. These peaks are a measure of the tracer gradients that mark the transition between the tropics and the mid-latitudes in the stratosphere, also referred to as the edges of the tropical pipe. The RE maxima are shown to be located closer to the equator in winter than in summer. This agrees well with the expected behaviour of the tropical pipe edges and is similar to results reported by other studies. Compared to other stratospheric mixing metrics, the RE has the advantage that it is easy to calculate: it does not, for example, require conversion to equivalent latitude, and it does not rely on dynamical information such as wind fields. The RE analysis also reveals occasional sudden poleward shifts of the southern hemisphere tropical pipe edge during austral winter which are accompanied by increased mid-latitude N2O levels. These events are investigated in more detail by creating daily high-resolution N2O maps, using a two-dimensional trajectory model and MERRA reanalysis winds to advect N2O observations forwards and backwards in time on isentropic surfaces. With the aid of this ‘domain filling’ technique it is illustrated that the increase in southern hemisphere mid-latitude N2O during austral winter is probably the cumulative effect of several large-scale, episodic leaks of N2O-rich air from the tropical pipe. A comparison with the global distribution of potential vorticity strongly suggests that irreversible mixing related to planetary wave breaking is the cause of the leak events. Between 2004 and 2011 the large-scale leaks are shown to occur approximately every second year, and a connection to the equatorial quasi-biennial oscillation is found to be likely, though this cannot be established conclusively owing to the relatively short data set.

    Identification and tracking of subsurface boundaries, such as ice layers in snow or the bedrock of a glacier, is the focus of the cryospheric part of this project. The utility of the RE for detecting amplitude gradients associated with reflections in GPR recordings is initially tested on a 25 MHz sounding of an Antarctic glacier. The results show distinct regions of increased RE values that allow identification of the glacial bedrock along large parts of the profile. Owing to its low computational requirements, the RE proves to be an effective pseudo gain function for initial analysis of GPR data in the field. While other gain functions often have to be tuned to give a good contrast between reflections and background noise over the whole vertical range of a profile, the RE tends to assign all detectable amplitude gradients a similar (high) value, resulting in a clear contrast between reflections and background scattering. Additionally, theoretical considerations allow the definition of a ‘standard’ data window size with which the RE can be applied to recordings made by most pulsed GPR systems and centre frequencies. This is confirmed by tests with higher frequency recordings (50 and 500 MHz) acquired on the McMurdo Ice Shelf. However, these also reveal that the RE processing is less reliable for identifying more closely spaced reflections from internal layers in dry snow. To complete the intended high-resolution analysis of accumulation patterns by tracking internal snow layers in the 500 MHz data from two test sites, a different processing approach is developed. Using an estimate of the emitted waveform from direct measurement, deterministic deconvolution via the Fourier domain is applied to the high-resolution GPR data. This reveals unambiguous reflection horizons which can be observed in repeat measurements made one year apart. Point measurements of average accumulation from snow pits and firn cores are extrapolated to larger areas by identifying and tracking a dateable dust layer horizon in the radargrams. Furthermore, it is shown that annual compaction rates of snow can be estimated by tracking several internal reflection horizons along the deconvolved radar profiles and calculating the average change in separation of horizon pairs from one year to the next. The technique is complementary to point measurements from other studies, and the derived compaction rates agree well with published values and theoretical estimates.
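
    For readers unfamiliar with the measure, the sketch below shows how an RE profile might be computed along a sampled signal such as a single GPR trace. It is a minimal illustration, not the processing chain of the thesis: the order α = 2, the histogram density estimate, and the window and bin sizes are assumptions chosen here for concreteness.

```python
import numpy as np

def renyi_entropy(samples, alpha=2.0, bins=32):
    """Rényi entropy H_a = log(sum(p_i**a)) / (1 - a), in nats, computed
    from a histogram estimate of the sample distribution."""
    counts, _ = np.histogram(samples, bins=bins)
    p = counts[counts > 0] / counts.sum()       # normalised, empty bins dropped
    return float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))

def re_profile(trace, window=64, step=8, **kwargs):
    """Sliding-window RE along one trace, usable as a pseudo gain function:
    windows containing detectable amplitude gradients receive similarly
    high values, standing out against the background scattering."""
    starts = range(0, len(trace) - window + 1, step)
    return np.array([renyi_entropy(trace[i:i + window], **kwargs) for i in starts])
```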

    Causal ambiguity and partial orders in event structures

    Event structure models often have some constraint which ensures that for each system run it is clear what the causal predecessors of an event are (i.e. there is no causal ambiguity). In this contribution we study what happens if we remove such constraints. We define five different partial order semantics that are intentional in the sense that they refer to syntactic aspects of the model. We also define an observational partial order semantics that derives a partial order from just the event traces. It turns out that this corresponds to the so-called early intentional semantics; the other intentional semantics cannot be characterized observationally. We study the equivalences induced by the different partial order definitions, and their interrelations.
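
    As a toy rendering of the observational idea, and not the paper's formal definition, the sketch below recovers a precedence relation from event traces alone: one event is placed before another exactly when the two co-occur in some trace and are never observed in the opposite order.

```python
from itertools import combinations

def observed_order(traces):
    """Precedence relation derived from event traces alone: e is placed
    before f iff the two events co-occur in at least one trace and e
    precedes f in every trace containing both."""
    earlier = {}                                  # earlier[e] = events e once preceded
    for trace in traces:
        pos = {e: i for i, e in enumerate(trace)}
        for e, f in combinations(pos, 2):
            a, b = (e, f) if pos[e] < pos[f] else (f, e)
            earlier.setdefault(a, set()).add(b)
    return {(e, f)
            for e, later in earlier.items()
            for f in later
            if e not in earlier.get(f, set())}    # never seen the other way round

# observed_order([list("abc"), list("acb")]) == {("a", "b"), ("a", "c")}:
# b and c occur in both orders, so they remain causally unordered.
```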

    Evaluation and Improvement of Procurement Process with Data Analytics

    This paper presents a compositional framework for the modeling of interactive continuous-time Markov chains with time-dependent rates, a subclass of communicating piecewise deterministic Markov processes. A polynomial-time algorithm is presented for computing the coarsest quotient under strong bisimulation for rate functions that are either piecewise uniform or (piecewise) polynomial. Strong as well as weak bisimulation are shown to be congruence relations for the compositional framework, thus allowing component-wise minimization. In addition, a new characterization of transient probabilities in time-inhomogeneous Markov chains with piecewise uniform rates is provided.
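
    To make the quotient construction concrete, here is a naive partition-refinement sketch for the constant-rate case. It deliberately ignores the time-dependent rate functions the paper handles and uses the textbook fixpoint iteration rather than the paper's polynomial-time algorithm; `rate` and `label` are hypothetical callbacks.

```python
def coarsest_lumping(states, rate, label):
    """Coarsest strong-bisimulation (ordinary lumpability) quotient of a
    CTMC with constant rates: start from the partition induced by state
    labels and keep splitting blocks until all states in a block have
    equal cumulative rates into every other block.
    rate(s, t) -> float gives the transition rate from s to t."""
    blocks = {}
    for s in states:                              # initial partition: group by label
        blocks.setdefault(label(s), set()).add(s)
    partition = [frozenset(b) for b in blocks.values()]
    while True:
        refined = []
        for block in partition:
            groups = {}
            for s in block:
                # cumulative rate from s into each block other than its own
                sig = tuple(sum(rate(s, t) for t in other)
                            for other in partition if other != block)
                groups.setdefault(sig, set()).add(s)
            refined.extend(frozenset(g) for g in groups.values())
        if len(refined) == len(partition):
            return refined                        # stable: no block splits further
        partition = refined
```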

    Towards a logic for performance and mobility

    Klaim is an experimental language designed for modeling and programming distributed systems composed of mobile components, where distribution awareness and dynamic system architecture configuration are key issues. StocKlaim [R. De Nicola, D. Latella, and M. Massink. Formal modeling and quantitative analysis of KLAIM-based mobile systems. In ACM Symposium on Applied Computing (SAC). ACM Press, 2005. Also available as Technical Report 2004-TR-25, CNR/ISTI, 2004] is a Markovian extension of the core subset of Klaim which includes process distribution, process mobility, asynchronous communication, and site creation. In this paper, MoSL, a temporal logic for StocKlaim, is proposed that addresses and integrates the issues of distribution awareness and mobility with those concerning the stochastic behaviour of systems. The satisfaction relation is formally defined over labelled Markov chains. A large fragment of the proposed logic can be translated to action-based CSL, for which efficient model checkers exist; such model checkers can therefore be used to verify StocKlaim models against MoSL properties. An example application is provided.

    Smart railroad maintenance engineering with stochastic model checking

    RAMS (reliability, availability, maintenance and safety) requirements are of the utmost importance for safety-critical systems like railroad infrastructure and signaling systems. Fault tree analysis (FTA) is a widely applied industry standard for RAMS analysis and is often one of the techniques preferred by railway organizations. FTA yields system availability and reliability, and can be used for critical path analysis. It cannot, however, yet deal with a pressing aspect of railroad engineering: maintenance. While railroad infrastructure providers focus more and more on managing cost/performance ratios, RAMS can be considered the performance specification, and maintenance the main cost driver. Methods facilitating the management of this ratio are still very uncommon. This paper presents a powerful, flexible and transparent technique to incorporate maintenance aspects in fault tree analysis, based on stochastic model checking. It enables the analysis and comparison of different maintenance strategies (such as age-based, clock-based and condition-dependent maintenance) and their impact on reliability and availability metrics, and thereby facilitates the trade-off between cost and RAMS performance. To keep the underlying state space small, two aggressive state-space reduction techniques are employed: compositional aggregation and smart semantics. The approach is illustrated on several existing, large fault tree models in a case study from Movares, a major RAMS consultancy firm in the Netherlands.
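
    To give a feel for how maintenance strategies trade availability against service effort, here is a toy Monte Carlo comparison of age-based versus clock-based renewal for a single ageing component. The paper instead computes such measures exactly, by stochastic model checking of the composed fault tree model; everything below, including all constants, is invented for illustration.

```python
import random

def availability(interval, age_based=True, horizon=1_000_000.0,
                 scale=100.0, shape=3.0, repair=20.0, service=2.0, seed=1):
    """Toy long-run availability of one component under preventive
    maintenance. Failures follow a Weibull law (shape > 1: an ageing
    part); corrective repair takes `repair` time units, preventive
    service takes `service`, and every action leaves the part as good
    as new. age_based=True renews the part once its age reaches
    `interval`; otherwise renewals follow a fixed clock schedule that
    ignores the component's age."""
    rng = random.Random(seed)
    t = down = 0.0
    next_clock = interval                        # clock-based schedule only
    while t < horizon:
        ttf = rng.weibullvariate(scale, shape)   # fresh part: time to failure
        to_service = interval if age_based else max(next_clock - t, 0.0)
        if ttf <= to_service:                    # failure strikes first
            t += ttf + repair
            down += repair
            while not age_based and next_clock <= t:
                next_clock += interval           # skip ticks missed during repair
        else:                                    # preventive renewal first
            t += to_service + service
            down += service
            if not age_based:
                next_clock += interval
    return 1.0 - down / t

# Comparing availability(80.0, age_based=True) with
# availability(80.0, age_based=False) contrasts the two schedules under
# identical failure behaviour; the age-based policy never renews a
# just-repaired part early.
```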